Group method of data handling (GMDH) is a family of inductive algorithms for computer-based mathematical modeling of multi-parametric datasets that features fully automatic structural and parametric optimization of models. GMDH is used in such fields as data mining, knowledge discovery, prediction, complex systems modeling, optimization and pattern recognition. GMDH algorithms are characterized by an inductive procedure that sorts out gradually complicated polynomial models and selects the best solution by means of the so-called ''external criterion''.

A GMDH model with multiple inputs and one output is a subset of components of the ''base function'' (1):

: <math>Y(x_1,\dots,x_n) = a_0 + \sum_{i=1}^{m} a_i f_i,</math>

where ''f<sub>i</sub>'' are elementary functions dependent on different sets of inputs, ''a<sub>i</sub>'' are coefficients and ''m'' is the number of base function components.

In order to find the best solution, GMDH algorithms consider various component subsets of the base function (1), called ''partial models''. Coefficients of these models are estimated by the least squares method. GMDH algorithms gradually increase the number of partial model components and find a model structure of optimal complexity, indicated by the minimum value of an ''external criterion''. This process is called self-organization of models.

The most popular base function used in GMDH is the gradually complicated Kolmogorov-Gabor polynomial (2):

: <math>Y(x_1,\dots,x_n) = a_0 + \sum_{i=1}^{n} a_i x_i + \sum_{i=1}^{n}\sum_{j=i}^{n} a_{ij} x_i x_j + \sum_{i=1}^{n}\sum_{j=i}^{n}\sum_{k=j}^{n} a_{ijk} x_i x_j x_k + \cdots</math>

The resulting models are also known as polynomial neural networks. Jürgen Schmidhuber cites GMDH as one of the earliest deep learning methods, remarking that it was used to train eight-layer neural nets as early as 1971.
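A minimal sketch of this selection process is given below. It is illustrative only: the data, the set of candidate components, the fifty-fifty split and the squared-error criterion are assumptions made for the example, not a reference implementation. Partial models are formed as subsets of candidate base-function components, their coefficients are estimated by least squares on one part of the data, and the structure with the smallest external criterion, computed on data not used for estimation, is selected.

<syntaxhighlight lang="python">
import itertools

import numpy as np

# Hypothetical data: 200 samples of two inputs and a noisy polynomial response.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(200, 2))
y = 1.0 + 2.0 * x[:, 0] - 3.0 * x[:, 0] * x[:, 1] + 0.1 * rng.normal(size=200)

# Candidate components f_i of the base function: second-degree monomials.
terms = {
    "x1": x[:, 0],
    "x2": x[:, 1],
    "x1*x2": x[:, 0] * x[:, 1],
    "x1^2": x[:, 0] ** 2,
    "x2^2": x[:, 1] ** 2,
}

train, check = slice(0, 100), slice(100, 200)  # fitting data vs. external data

best = None
for r in range(1, len(terms) + 1):
    for subset in itertools.combinations(terms, r):
        # Partial model: constant term plus the chosen subset of components.
        A = np.column_stack([np.ones(len(y))] + [terms[name] for name in subset])
        # Coefficients are estimated by least squares on the training part only.
        coef, *_ = np.linalg.lstsq(A[train], y[train], rcond=None)
        # External criterion: squared error on data not used for estimation.
        criterion = float(np.mean((A[check] @ coef - y[check]) ** 2))
        if best is None or criterion < best[0]:
            best = (criterion, subset, coef)

print("selected components:", best[1])
print("external criterion value:", best[0])
</syntaxhighlight>

In this sketch the model of optimal complexity is the subset of components whose hold-out error is minimal; a full GMDH algorithm would additionally grow the set of candidate components layer by layer rather than enumerating all subsets at once.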
==History==
The method was originated in 1968 by Prof. Alexey G. Ivakhnenko at the Institute of Cybernetics in Kiev (then in the Ukrainian SSR). From the very beginning the approach was computer-based, so a set of computer programs and algorithms were the primary practical results achieved on the basis of the new theoretical principles. Thanks to the author's policy of open code sharing, the method quickly became established in a large number of scientific laboratories worldwide. At that time code sharing was quite a physical act, since the Internet is at least 5 years younger than GMDH. Despite this, the first investigation of GMDH outside the Soviet Union was carried out soon after, by R. Shankar in 1972. Later on, different GMDH variants were published by Japanese and Polish scientists.

The period 1968-1971 is characterized by application of the regularity criterion only, for solving problems of identification, pattern recognition and short-term forecasting. Polynomials, logical nets, fuzzy Zadeh sets and Bayes probability formulas were used as reference functions. The authors were stimulated by the very high forecasting accuracy of the new approach. Noise immunity was not investigated.

Period 1972-1975: the problem of modeling noised data and data with an incomplete information basis was solved. Multicriteria selection and the use of additional a priori information for increasing noise immunity were proposed. The best experiments showed that, with an extended definition of the optimal model by an additional criterion, the noise level can be ten times higher than the signal. This was later improved using Shannon's theorem of general communication theory.

Period 1976-1979: the convergence of multilayered GMDH algorithms was investigated. It was shown that some multilayered algorithms have a "multilayerness error", analogous to the static error of control systems. In 1977 a solution of objective systems analysis problems by multilayered GMDH algorithms was proposed. It turned out that sorting-out by a criteria ensemble finds the only optimal system of equations and therefore reveals the elements of a complex object and their main input and output variables.

Period 1980-1988: many important theoretical results were obtained. It became clear that full physical models cannot be used for long-term forecasting. It was proved that non-physical GMDH models are more accurate for approximation and forecasting than the physical models of regression analysis. Two-level algorithms which use two different time scales for modeling were developed.

Since 1989, new algorithms (AC, OCC, PF) for non-parametric modeling of fuzzy objects and SLP for expert systems have been developed and investigated. The present stage of GMDH development can be described as a blossoming of twice-multilayered neural nets and parallel combinatorial algorithms for multiprocessor computers.